Divergence measures based on the Shannon entropy
Authors
J. Lin
Abstract
A new class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationship with the variational distance and the probability of misclassification error are ...
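The new class of measures centers on what is now known as the Jensen-Shannon divergence, JSD(P, Q) = H((P+Q)/2) - (H(P) + H(Q))/2, where H is the Shannon entropy. A minimal Python sketch for finite discrete distributions (the function names and the bits-based logarithm are illustrative choices, not the paper's notation):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) in bits, skipping zero-probability cells."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def jensen_shannon_divergence(p, q):
    """JSD(p, q) = H((p+q)/2) - (H(p) + H(q)) / 2.

    Defined even when p and q are not absolutely continuous with
    respect to each other, unlike the Kullback divergence.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

# Disjoint supports: the Kullback divergence is infinite, JSD stays bounded.
print(jensen_shannon_divergence([1.0, 0.0], [0.0, 1.0]))  # -> 1.0 bit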
Similar Articles
On the Estimation of Shannon Entropy
Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are then made with Vasicek's (1976), van Es (1992), Ebrahimi et al. (1994) and Correa (1995) entropy estimators. A simulation st...
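For context, Vasicek's (1976) estimator referenced in the comparison is based on sample spacings; a minimal sketch, assuming a continuous sample with no ties and treating the window size m as a user-chosen parameter:

```python
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek's (1976) spacing estimator of differential entropy (nats).

    x : 1-D sample from a continuous distribution
    m : window size, 1 <= m < n/2
    Order statistics falling outside [1, n] are clamped to the extremes.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

# Standard normal sample: true entropy is 0.5 * ln(2*pi*e), about 1.4189 nats.
rng = np.random.default_rng(0)
print(vasicek_entropy(rng.standard_normal(10_000), m=50))
```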
Gibbs-Shannon Entropy and Related Measures: Tsallis Entropy
In this research paper, it is proved that an approximation to the Gibbs-Shannon entropy measure naturally leads to the Tsallis entropy for the real parameter q = 2. Several interesting measures based on the input as well as the output of a discrete memoryless channel are provided, and some of the properties of those measures are discussed. It is expected that these results will be of utility in Information ...
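The Tsallis entropy in question, S_q(p) = (1 - sum_i p_i^q) / (q - 1), takes a particularly simple form at q = 2. A minimal sketch (identifiers are illustrative):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1), q != 1.

    As q -> 1 this recovers the Gibbs-Shannon entropy -sum_i p_i ln p_i;
    at q = 2 it reduces to 1 - sum_i p_i^2.
    """
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.25, 0.25])
print(tsallis_entropy(p, q=2))        # 1 - sum(p_i^2)
print(-np.sum(p * np.log(p)))         # Gibbs-Shannon entropy (nats)
print(tsallis_entropy(p, q=1.0001))   # approaches the Shannon value near q = 1
```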
How well do practical information measures estimate the Shannon entropy?
Estimating the entropy of finite strings has applications in areas such as event detection, similarity measurement, and the performance assessment of compression algorithms. This report compares a variety of computable information measures for finite strings that may be used in entropy estimation. These include Shannon's n-block entropy, the three variants of the Lempel-Ziv production complexi...
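Shannon's n-block entropy, the first measure listed, can be estimated from empirical frequencies of overlapping length-n substrings; a minimal sketch, assuming a plain character string as input:

```python
from collections import Counter
import math

def n_block_entropy(s, n):
    """Empirical Shannon n-block entropy (bits) of a finite string.

    Counts each length-n substring over overlapping windows and plugs
    the empirical probabilities into the entropy formula.
    """
    blocks = Counter(s[i:i + n] for i in range(len(s) - n + 1))
    total = sum(blocks.values())
    return -sum(c / total * math.log2(c / total) for c in blocks.values())

s = "abababababababab"
for n in (1, 2, 3):
    # H_n / n approaches the entropy rate as n grows (for long strings).
    print(n, n_block_entropy(s, n) / n)
```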
Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis
Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theo...
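The JSD generalizes to several distributions with weights pi_i as JSD_pi(P_1, ..., P_k) = H(sum_i pi_i P_i) - sum_i pi_i H(P_i). A minimal sketch of this weighted form (identifiers are illustrative):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) in bits, skipping zero-probability cells."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def generalized_jsd(dists, weights):
    """Weighted Jensen-Shannon divergence of several distributions:

        JSD_pi(P_1, ..., P_k) = H(sum_i pi_i P_i) - sum_i pi_i H(P_i)
    """
    dists = np.asarray(dists, dtype=float)
    weights = np.asarray(weights, dtype=float)
    mixture = weights @ dists  # weighted-average (mixture) distribution
    return shannon_entropy(mixture) - np.sum(
        weights * np.array([shannon_entropy(p) for p in dists]))

# Three symbol distributions with equal weights.
P = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4]]
print(generalized_jsd(P, [1/3, 1/3, 1/3]))
```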
Journal
Title: IEEE Transactions on Information Theory
Year: 1991
ISSN: 0018-9448
DOI: 10.1109/18.61115